Continuously Evolving Programs in Genetic Programming Using Gradient Descent
Authors
Abstract
This paper describes an approach to using gradient descent search in genetic programming (GP) to continuously evolve genetic programs for object classification problems. An inclusion factor is introduced at each node of a genetic program, and gradient descent search is applied to these inclusion factors. Three new on-zero operators and two new continuous genetic operators are developed for evolution. The approach is examined and compared with a basic GP approach on three object classification problems of varying difficulty. The results suggest that the new approach can evolve genetic programs continuously. The new method that uses the standard genetic operators together with gradient descent on the inclusion factors substantially outperforms the basic GP approach, which uses the standard genetic operators but neither gradient descent nor inclusion factors. However, the new method that combines the continuous operators with gradient descent on the inclusion factors decreases performance on all the problems.
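To make the idea concrete, the sketch below (not the authors' code) shows one plausible reading of the approach: each node of an evolved program tree carries an inclusion factor that scales its output, and a gradient descent step adjusts those factors against the training error while the tree structure itself is left to the genetic operators. The Node class, the operator set, and the finite-difference gradient are illustrative assumptions; the paper would presumably use analytic derivatives.

```python
import math

class Node:
    """GP tree node with a trainable inclusion factor w (assumed here to scale
    the node's output; the exact formulation in the paper may differ)."""
    def __init__(self, op, children=(), value=None):
        self.op = op                  # 'add', 'mul', 'var', or 'const'
        self.children = list(children)
        self.value = value            # feature index for 'var', number for 'const'
        self.w = 1.0                  # inclusion factor: 1 = fully included

    def forward(self, x):
        if self.op == 'var':
            out = x[self.value]
        elif self.op == 'const':
            out = self.value
        elif self.op == 'add':
            out = sum(c.forward(x) for c in self.children)
        else:                         # 'mul'
            out = math.prod(c.forward(x) for c in self.children)
        return self.w * out           # the inclusion factor gates the subtree

def collect_nodes(root):
    stack, nodes = [root], []
    while stack:
        n = stack.pop()
        nodes.append(n)
        stack.extend(n.children)
    return nodes

def gradient_step_on_inclusion_factors(root, data, lr=0.01, eps=1e-4):
    """One gradient-descent step on all inclusion factors, using a
    finite-difference estimate of the mean-squared-error gradient."""
    nodes = collect_nodes(root)
    mse = lambda: sum((root.forward(x) - y) ** 2 for x, y in data) / len(data)
    base = mse()
    grads = []
    for n in nodes:
        n.w += eps
        grads.append((mse() - base) / eps)
        n.w -= eps
    for n, g in zip(nodes, grads):
        n.w -= lr * g

# Toy usage: program (x0 * x1) + x0 adapted on two labelled points.
tree = Node('add', [Node('mul', [Node('var', value=0), Node('var', value=1)]),
                    Node('var', value=0)])
data = [([1.0, 2.0], 1.0), ([2.0, 1.0], -1.0)]
for _ in range(100):
    gradient_step_on_inclusion_factors(tree, data)
```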
Similar resources
Efficient Learning Through Evolution: Neural Programming and Internal Reinforcement
Genetic programming (GP) can learn complex concepts by searching for the target concept through evolution of a population of candidate hypothesis programs. However, unlike some learning techniques, such as Artificial Neural Networks (ANNs), GP does not have a principled procedure for changing parts of a learned structure based on that structure's performance on the training data. GP is missing a c...
Neural Programming and an Internal Reinforcement Policy
An important reason for the continued popularity of Artificial Neural Networks (ANNs) in the machine learning community is that the gradient-descent backpropagation procedure gives ANNs a locally optimal change procedure and, in addition, a framework for understanding the ANN learning performance. Genetic programming (GP) is also a successful evolutionary learning technique that provides powerf...
Faster Genetic Programming based on Local Gradient Search of Numeric Leaf Values
We examine the effectiveness of gradient search optimization of numeric leaf values for Genetic Programming. Genetic search for tree-like programs at the population level is complemented by the optimization of terminal values at the individual level. Local adaptation of individuals is made easier by algorithmic differentiation. We show how conventional random constants are tuned by gradient des...
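The snippet below is a hedged sketch, not the cited paper's implementation, of the core idea it describes: tuning a numeric leaf constant of an evolved expression by gradient descent, with the derivative obtained through forward-mode algorithmic differentiation using dual numbers. The Dual class, the example expression, and the squared-error loss are assumptions made for illustration.

```python
class Dual:
    """A value paired with its derivative with respect to one chosen constant."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)
    def __mul__(self, other):
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

def evaluate(x, c):
    """Example evolved expression (x * c) + x, written in dual arithmetic so the
    derivative with respect to the constant c is carried alongside the value."""
    return Dual(x) * c + Dual(x)

def tune_constant(data, c0=0.0, lr=0.05, steps=200):
    """Gradient descent on the numeric leaf value c of the expression above."""
    c = c0
    for _ in range(steps):
        grad = 0.0
        for x, y in data:
            out = evaluate(x, Dual(c, 1.0))        # seed dc/dc = 1
            grad += 2 * (out.val - y) * out.dot    # d/dc of squared error
        c -= lr * grad / len(data)
    return c

# Example: recover c = 2.5 from targets generated with that constant.
data = [(x, x * 2.5 + x) for x in [0.5, 1.0, 2.0, 3.0]]
print(tune_constant(data))   # converges towards 2.5
```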
Forecasting GDP Growth Using ANN Model with Genetic Algorithm
Applying nonlinear models to the estimation and forecasting of economic models is now becoming more common, thanks to advances in computing technology. Artificial Neural Network (ANN) models, which are nonlinear local optimizer models, have proven successful in forecasting economic variables. Most ANN models applied in Economics use the gradient descent method as their learning algorithm. However, t...
Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns
The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and the second momentum term in feed-forward neural networks. This analysis is conducted on 250 different three-letter lowercase words from the English alphabet. These words are presented to two vertical segmentation programs designed in MATLAB and based on portions (1...
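For orientation only, the sketch below shows the standard gradient-descent-with-momentum weight update that such a backpropagation trainer builds on; it does not reproduce the cited paper's modified "second momentum term", and the toy loss in the usage example is an assumption.

```python
def sgd_momentum_step(w, grad, velocity, lr=0.1, momentum=0.9):
    """Classic momentum update: the velocity accumulates a decaying sum of past
    gradients, smoothing the weight trajectory during backpropagation."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# Toy usage on a single scalar weight with loss (w - 1.5)^2.
w, v = 0.0, 0.0
for _ in range(200):
    grad = 2 * (w - 1.5)
    w, v = sgd_momentum_step(w, grad, v)
print(round(w, 4))   # close to 1.5
```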
Journal title:
Volume / Issue:
Pages: -
Publication date: 2004